TokRepo
LiteLLM (BerriAI)

Joined March 2026
3 assets · 0 stars earned · 27 total views
⚡

Workflows

1

LiteLLM Proxy — Unified Gateway for 100+ LLM APIs

LiteLLM Proxy maps 100+ LLM providers (Anthropic, OpenAI, Bedrock, Vertex) to a single OpenAI-compatible endpoint, with authentication, rate limiting, cost tracking, and fallbacks.

May 7, 2026
7
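A minimal sketch of what a proxy config for this setup might look like. The model aliases, model IDs, and fallback pairing below are placeholders, not a recommended deployment; check the LiteLLM docs for the exact config schema your version supports.

```yaml
model_list:
  - model_name: gpt-4o                     # alias clients request
    litellm_params:
      model: openai/gpt-4o                 # provider/model behind the alias
      api_key: os.environ/OPENAI_API_KEY   # read key from the environment
  - model_name: claude
    litellm_params:
      model: anthropic/claude-3-5-sonnet-20240620
      api_key: os.environ/ANTHROPIC_API_KEY

litellm_settings:
  fallbacks: [{"gpt-4o": ["claude"]}]      # if gpt-4o fails, retry on claude
```

Started with `litellm --config config.yaml`, the proxy then serves both models at one OpenAI-compatible `/chat/completions` endpoint.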
📜

Scripts

1

LiteLLM Router — Smart Failover & Load Balancing in Python

LiteLLM Router routes requests across LLM endpoints with retries, fallbacks, latency-based routing, and weighted A/B splits. Pure Python: drop it into any codebase, no separate proxy needed.

May 7, 2026
6
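To make the routing idea concrete, here is a toy, pure-Python sketch of weighted selection with failover. This is not the litellm Router API; the deployment names, weights, and the flaky call below are invented for illustration only.

```python
import random

def pick_deployment(deployments, exclude=()):
    """Pick a deployment by weight, skipping any that already failed."""
    pool = [d for d in deployments if d["name"] not in exclude]
    if not pool:
        raise RuntimeError("all deployments exhausted")
    weights = [d["weight"] for d in pool]
    return random.choices(pool, weights=weights, k=1)[0]

def complete_with_failover(deployments, call, max_attempts=3):
    """Try weighted picks until one call succeeds, excluding failures."""
    failed = set()
    for _ in range(max_attempts):
        dep = pick_deployment(deployments, exclude=failed)
        try:
            return dep["name"], call(dep)
        except Exception:
            failed.add(dep["name"])
    raise RuntimeError("no deployment succeeded")

# Hypothetical 90/10 A/B split between two deployments.
deployments = [
    {"name": "primary", "weight": 9},
    {"name": "backup", "weight": 1},
]

def flaky_call(dep):
    # Simulate the primary being down so the failover path is exercised.
    if dep["name"] == "primary":
        raise TimeoutError("primary is down")
    return "ok"

name, result = complete_with_failover(deployments, flaky_call)
print(name, result)  # falls back to: backup ok
```

The real Router layers retries, cooldowns, and latency tracking on top of this basic pick-then-fail-over loop.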
📚

Knowledge

1

LiteLLM Cost Tracking — Per-Project LLM Spend Dashboard

LiteLLM ships a built-in cost dashboard: track LLM spend by project, user, model, or tag, and set hard budgets that block requests at the proxy. SOC2 and SSO are available in the Pro tier.

May 7, 2026
14
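Conceptually, a hard budget that "blocks at the proxy" is just per-project spend accounting with a rejection path. The sketch below illustrates that idea only; the class, project names, prices, and exception are invented, not LiteLLM's implementation.

```python
from collections import defaultdict

class BudgetExceeded(Exception):
    """Raised when a request would push a project over its hard budget."""

class SpendTracker:
    def __init__(self, budgets):
        self.budgets = budgets            # project -> max USD allowed
        self.spend = defaultdict(float)   # project -> USD spent so far

    def record(self, project, cost_usd):
        """Reject the request if it would exceed the project's budget."""
        limit = self.budgets.get(project, float("inf"))
        if self.spend[project] + cost_usd > limit:
            raise BudgetExceeded(f"{project} over budget")
        self.spend[project] += cost_usd

tracker = SpendTracker({"search": 1.00})   # $1 hard cap for "search"
tracker.record("search", 0.40)
tracker.record("search", 0.40)
try:
    tracker.record("search", 0.40)         # would total $1.20 > $1.00
    blocked = False
except BudgetExceeded:
    blocked = True
print(tracker.spend["search"], blocked)
```

In the proxy, the equivalent check runs before the upstream call, so an over-budget project gets an error instead of a completion.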

© 2026 TokRepo. All rights reserved.


軒轅十四株式会社 · Tokyo, Japan

〒101-0032 Tokyo, Chiyoda-ku, Iwamotocho 2-chome

Contact: ethanfrostcool@gmail.com